    Weak order for the discretization of the stochastic heat equation driven by impulsive noise

    Considering a linear parabolic stochastic partial differential equation driven by impulsive space-time noise, dX_t + AX_t dt = Q^{1/2} dZ_t, X_0 = x_0 \in H, t \in [0,T], we approximate the distribution of X_T. Here (Z_t)_{t\in[0,T]} is an impulsive cylindrical process and Q describes the spatial covariance structure of the noise; Tr(A^{-\alpha}) < \infty for some \alpha > 0, and A^\beta Q is bounded for some \beta \in (\alpha-1,\alpha]. A discretization (X_h^n)_{n\in\{0,1,...,N\}} is defined via the finite element method in space (parameter h>0) and a \theta-method in time (parameter \Delta t = T/N). For \phi \in C^2_b(H;R) we show an integral representation for the error |E\phi(X^N_h)-E\phi(X_T)| and prove that |E\phi(X^N_h)-E\phi(X_T)| = O(h^{2\gamma}+(\Delta t)^{\gamma}), where \gamma < 1-\alpha+\beta.
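    The time stepping described in this abstract can be sketched numerically. The following is a minimal illustration of a \theta-method step for a spatially discretized stochastic heat equation; it uses finite differences instead of the paper's finite elements, and replaces the impulsive noise Z_t with Gaussian space-time white noise, so all parameter values and scalings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spatial grid: M interior points on (0, 1), Dirichlet boundary conditions.
M, T, N, theta = 50, 1.0, 100, 0.5
h, dt = 1.0 / (M + 1), T / N

# Discrete Laplacian A (finite differences for simplicity; the paper
# discretizes with finite elements instead).
A = (np.diag(2.0 * np.ones(M)) - np.diag(np.ones(M - 1), 1)
     - np.diag(np.ones(M - 1), -1)) / h**2

I = np.eye(M)
lhs = I + theta * dt * A            # implicit part of the theta-method
rhs = I - (1.0 - theta) * dt * A    # explicit part

x = np.sin(np.pi * np.linspace(h, 1.0 - h, M))   # initial condition x_0
for _ in range(N):
    # Gaussian space-time white-noise increment (a stand-in for the
    # impulsive cylindrical noise Z_t of the paper).
    dW = rng.standard_normal(M) * np.sqrt(dt / h)
    x = np.linalg.solve(lhs, rhs @ x + dW)

print(x.shape)
```

    With theta = 0.5 this is the Crank-Nicolson variant of the scheme, which is unconditionally stable for the deterministic part.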

    Elliptic Rydberg states as direction indicators

    The orientation in space of a Cartesian coordinate system can be indicated by the two vectorial constants of motion of a classical Keplerian orbit: the angular momentum and the Laplace-Runge-Lenz vector. In quantum mechanics, the states of a hydrogen atom that mimic classical elliptic orbits are the coherent states of the SO(4) rotation group. It is known how to produce these states experimentally. They have minimal dispersions of the two conserved vectors and can be used as direction indicators. We compare the fidelity of this transmission method with that of the idealized optimal method.

    A Continuous Time GARCH Process Driven by a Lévy Process: Stationarity and Second Order Behaviour

    We use a discrete time analysis, giving necessary and sufficient conditions for the almost sure convergence of ARCH(1) and GARCH(1,1) discrete time models, to suggest an extension of the (G)ARCH concept to continuous time processes. Our "COGARCH" (continuous time GARCH) model, based on a single background driving Lévy process, is different from, though related to, other continuous time stochastic volatility models that have been proposed. The model generalises the essential features of discrete time GARCH processes, and is amenable to further analysis, possessing useful Markovian and stationarity properties.
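    The discrete-time GARCH(1,1) recursion that the COGARCH construction generalises can be sketched as follows. The parameter values and Gaussian innovations are illustrative assumptions; COGARCH itself replaces the i.i.d. innovations by the jumps of a single driving Lévy process.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete-time GARCH(1,1): Y_n = eps_n * sigma_n,
# sigma_{n+1}^2 = beta + (delta + lam * eps_n^2) * sigma_n^2.
# Parameter names and values are illustrative.
beta, delta, lam, n = 0.1, 0.8, 0.1, 10_000
eps = rng.standard_normal(n)

sigma2 = np.empty(n)
sigma2[0] = beta / (1.0 - delta - lam)   # start at the stationary mean
for i in range(n - 1):
    sigma2[i + 1] = beta + (delta + lam * eps[i] ** 2) * sigma2[i]
Y = eps * np.sqrt(sigma2)

# Nelson-type a.s. stationarity criterion for GARCH(1,1):
# E[log(delta + lam * eps^2)] < 0 (estimated here by a sample mean).
crit = np.mean(np.log(delta + lam * eps**2))
print(crit < 0)
```

    The sign of the log-moment criterion is the kind of condition the abstract's discrete-time analysis makes precise before carrying it over to continuous time.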

    Continuous time volatility modelling: COGARCH versus Ornstein-Uhlenbeck models

    We compare the probabilistic properties of the non-Gaussian Ornstein-Uhlenbeck based stochastic volatility model of Barndorff-Nielsen and Shephard (2001) with those of the COGARCH process. The latter is a continuous time GARCH process introduced by the authors (2004). Many features are shown to be shared by both processes, but differences are pointed out as well. Furthermore, it is shown that the COGARCH process has Pareto-like tails under weak regularity conditions.

    ATS-6 flight accelerometers

    Five accelerometers mounted near the adapter base of the Titan 3-C launch vehicle and three on the hub of the ATS-F spacecraft provided (1) data for verifying basic spacecraft mode shapes and frequencies during powered flight while attached to the launch vehicle; (2) failure mode detection and diagnostic information on in-flight anomalies; and (3) data to be used in the design of future spacecraft to be flown on the Titan 3-C. Because data from the instruments mounted on the spacecraft hub passed through an in-flight disconnect at the separation plane between the transtage and ATS-F, the moment this connector was broken, the signal to the telemetry system showed a step function change. By monitoring these telemetry traces on the ground at appropriate times during flight sequences, a positive indication of spacecraft separation was obtained. Flight data showing dynamic response at the spacecraft/launch vehicle interface and at the top of the ATS spacecraft during significant launch events are presented in tables.

    Efficient algorithms for conditional independence inference

    The topic of the paper is computer testing of (probabilistic) conditional independence (CI) implications by an algebraic method of structural imsets. The basic idea is to transform (sets of) CI statements into certain integral vectors and to verify by computer the corresponding algebraic relation between the vectors, called the independence implication. We interpret the previous methods for computer testing of this implication from the point of view of polyhedral geometry. However, the main contribution of the paper is a new method, based on linear programming (LP). The new method overcomes the limitation of former methods on the number of involved variables. We recall and describe the theoretical basis for all four methods involved in our computational experiments, whose aim was to compare the efficiency of the algorithms. The experiments show that the LP method is clearly the fastest one. As an example of a possible application of such algorithms we show that testing inclusion of Bayesian network structures, or whether a CI statement is encoded in an acyclic directed graph, can be done by the algebraic method.
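    The membership question behind the algebraic method can be illustrated with a toy check: decide whether a target integral vector is a nonnegative integer combination of a set of generator vectors. The vectors below are made up for illustration and are not actual structural imsets; the paper's LP method answers the same kind of question with linear programming rather than exhaustive search.

```python
from itertools import product

# Toy generators standing in for "elementary" integral vectors; the target u
# stands in for the vector obtained from a CI statement. All illustrative.
gens = [(1, 0, 1), (0, 1, 1), (1, 1, 0)]
u = (2, 1, 1)

def in_cone(target, generators, max_coeff=5):
    """Brute force: is target a nonnegative integer combination of generators?"""
    for coeffs in product(range(max_coeff + 1), repeat=len(generators)):
        combo = tuple(sum(c * g[i] for c, g in zip(coeffs, generators))
                      for i in range(len(target)))
        if combo == target:
            return True
    return False

print(in_cone(u, gens))          # u = gens[0] + gens[2], so membership holds
print(in_cone((0, 0, 1), gens))  # no combination of these generators works
```

    The exhaustive search is exponential in the number of generators, which is exactly the scaling an LP relaxation avoids.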

    Modelling bark beetle disturbances in a large scale forest scenario model to assess climate change impacts and evaluate adaptive management strategies

    To study potential consequences of climate-induced changes in the biotic disturbance regime at regional to national scale we integrated a model of Ips typographus (L. Scol. Col.) damages into the large-scale forest scenario model EFISCEN. A two-stage multivariate statistical meta-model was used to upscale stand level damages by bark beetles as simulated in the hybrid forest patch model PICUS v1.41. Comparing EFISCEN simulations including the new bark beetle disturbance module against a 15-year damage time series for Austria showed good agreement at province level (R² between 0.496 and 0.802). A scenario analysis of climate change impacts on bark beetle-induced damages in Austria's Norway spruce [Picea abies (L.) Karst.] forests resulted in a strong increase in damages (from 1.33 Mm³ a⁻¹, period 1990–2004, to 4.46 Mm³ a⁻¹, period 2095–2099). Studying two adaptive management strategies (species change) revealed a considerable time-lag between the start of adaptation measures and a decrease in simulated damages by bark beetle.

    Distributed control for COFS 1

    An overview is given of the work being done at NASA LaRC on developing the Control of Flexible Structures (COFS) 1 Flight Experiment Baseline Control Law. This control law is currently evolving into a generic control system software package designed to serve many, but not all, guest investigators. A system simulator is also described. It is currently being developed for COFS-1 and will be used to develop the Baseline Control Law and to evaluate guest investigator control schemes. It will be available for use whether or not control schemes fall into the category of the Baseline Control Law. First, the hardware configuration for control experiments is described. This is followed by a description of the simulation software. Open-loop sinusoid excitation time histories are next presented, both with and without a local controller, for the Linear DC Motor (LDCM) actuators currently planned for the flight. The generic control law follows, and algorithm processing requirements are cited for a nominal case of interest. Finally, a closed-loop simulation study is presented, and the state of the work is summarized in the concluding remarks.

    Optimization of a neutrino factory oscillation experiment

    We discuss the optimization of a neutrino factory experiment for neutrino oscillation physics in terms of muon energy, baselines, and oscillation channels (gold, silver, platinum). In addition, we study the impact and requirements of detector technology improvements, and we compare the results to beta beams. We find that the optimized neutrino factory has two baselines, one at about 3000 to 5000 km, the other at about 7500 km ("magic" baseline). The threshold and energy resolution of the golden channel detector have the most promising optimization potential. This, in turn, could be used to lower the muon energy from about 50 GeV to about 20 GeV. Furthermore, the inclusion of electron neutrino appearance with charge identification (platinum channel) could help for large values of \sin^2 2\theta_{13}. Though tau neutrino appearance with charge identification (silver channel) helps, in principle, to resolve degeneracies for intermediate \sin^2 2\theta_{13}, we find that alternative strategies may be more feasible in this parameter range. As far as matter density uncertainties are concerned, we demonstrate that their impact can be reduced by the combination of different baselines and channels. Finally, in comparison to beta beams and other alternative technologies, we clearly establish a superior performance for a neutrino factory in the case \sin^2 2\theta_{13} < 0.01.
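    For intuition about why baseline and energy are the optimization knobs, here is the textbook two-flavor vacuum oscillation probability. This is a deliberate simplification: the study above uses full three-flavor probabilities in matter, and the parameter values below are only illustrative.

```python
import math

# Two-flavor vacuum oscillation probability (textbook formula):
# P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])
def p_osc(sin2_2theta, dm2_ev2, L_km, E_GeV):
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Illustrative values: sin^2(2*theta_13) = 0.01, atmospheric
# dm2 ~ 2.5e-3 eV^2, the ~3000 km and ~7500 km baselines from the
# abstract at a muon energy of 20 GeV.
for L in (3000.0, 7500.0):
    print(L, p_osc(0.01, 2.5e-3, L, 20.0))
```

    Varying L and E in this formula shows how the oscillation phase, and hence the appearance signal, depends on the baseline-to-energy ratio that the experiment design must tune.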